21st Century Simulation: Exploiting High Performance Computing and Data Analysis
This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded
paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to
overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel
computing, in supercomputers and Linux clusters, has proven effective by providing users an advantage in
computing power. This has been characterized as a ten-year lead over the use of single-processor computers.
Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power.
JFCOM's JESPP project is one of the few simulation initiatives to effectively embrace these concepts. The
challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant
populations, to focus on impacts on civilian infrastructure, to differentiate combatants from non-combatants,
and to understand non-linear, asymmetric warfare. These requirements stretch both current
computational techniques and data analysis methodologies. In this paper, documented examples and potential
solutions will be advanced. The authors discuss the paths to successful implementation based on their experience.
Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, OpsResearch,
database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses.
The modeling and simulation community has significant potential to provide more opportunities for training and
analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more
realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights
for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased
understanding of future vulnerabilities to help avoid unneeded mission failures and unacceptable personnel losses.
The authors set forth road maps for rapid prototyping and adoption of advanced capabilities. They discuss the
beneficial impact of embracing these technologies, as well as the risk mitigation required to ensure success.
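Of the technologies the abstract reviews, Monte Carlo sensitivity analysis is the most compact to illustrate. The sketch below uses a hypothetical three-parameter response surface as a stand-in for an expensive simulation run; the parameter names and ranges are illustrative assumptions, not drawn from the JESPP project:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical simulation inputs, sampled uniformly over plausible ranges.
detection_range  = rng.uniform(5.0, 15.0, n)   # km
sensor_error     = rng.uniform(0.1, 1.0, n)    # km
civilian_density = rng.uniform(10, 1000, n)    # persons / km^2

# Stand-in response surface in place of a full simulation run.
outcome = detection_range - 2.0 * sensor_error + 0.001 * civilian_density

# First-order sensitivity via squared correlation: the fraction of output
# variance attributable to each input taken alone.
for name, x in [("detection_range", detection_range),
                ("sensor_error", sensor_error),
                ("civilian_density", civilian_density)]:
    r = np.corrcoef(x, outcome)[0, 1]
    print(f"{name}: R^2 = {r ** 2:.3f}")
```

With these (assumed) ranges, detection range dominates the output variance; the same ranking procedure applies unchanged when `outcome` comes from real simulation runs instead of a closed-form surrogate.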
High-performance computing enables simulations to transform education
This paper presents the case that education in the 21st
Century can only measure up to national needs if technologies developed in the simulation community, further
enhanced by the power of high performance computing,
are harnessed to supplant traditional didactic instruction. The authors cite their professional experiences in simulation, high performance computing and pedagogical studies to support their thesis that this implementation is not only required, it is feasible, supportable and affordable. Surveying and reporting on work in computer-aided education, this paper will discuss the pedagogical imperatives for group learning,
risk management and “hero teacher” surrogates, all being optimally delivered with entity level simulations of varying types. Further, experience and research are adduced to support the thesis that effective implementation of this level of simulation is enabled only by, and is largely dependent upon, high performance computing, especially by the ready utility and acceptable cost of Linux clusters.
Palatini approach to Born-Infeld-Einstein theory and a geometric description of electrodynamics
The field equations associated with the Born-Infeld-Einstein action are
derived using the Palatini variational technique. In this approach the metric
and connection are varied independently and the Ricci tensor is generally not
symmetric. For sufficiently small curvatures the resulting field equations can
be divided into two sets. One set, involving the antisymmetric part of the
Ricci tensor, consists of the field equation for
a massive vector field. The other set consists of the Einstein field equations
with an energy momentum tensor for the vector field plus additional
corrections. In a vacuum with a vanishing antisymmetric part of the Ricci
tensor the field equations are shown to be the usual Einstein vacuum equations. This extends the
universality of the vacuum Einstein equations, discussed by Ferraris et al.
\cite{Fe1,Fe2}, to the Born-Infeld-Einstein action. In the simplest version of
the theory there is a single coupling constant and by requiring that the
Einstein field equations hold to a good approximation in neutron stars it is
shown that the mass of the vector field exceeds the lower bound on the mass of the
photon. Thus, in this case the vector field cannot represent the
electromagnetic field and would describe a new geometrical field. In a more
general version in which the symmetric and antisymmetric parts of the Ricci
tensor have different coupling constants it is possible to satisfy all of the
observational constraints if the antisymmetric coupling is much larger than the
symmetric coupling. In this case the antisymmetric part of the Ricci tensor can
describe the electromagnetic field, although gauge invariance will be broken.
Distributed and Interactive Simulations Operating at Large Scale for Transcontinental Experimentation
This paper addresses the use of emerging technologies to respond to the increasing needs for larger and more sophisticated agent-based simulations of urban areas. The U.S. Joint Forces Command has found it useful to seek out and apply technologies largely developed for academic research in the physical sciences. The use of these techniques in transcontinentally distributed, interactive experimentation has been shown to be effective and stable, and the analyses of the data find parallels in the behavioral sciences. The authors relate their decade-and-a-half experience in implementing high performance computing hardware, software, and user-interface architectures. These have enabled heretofore unachievable results. They focus on three advances: the use of general purpose graphics processing units as computing accelerators, the efficiencies derived from implementing interest-managed routers in distributed systems, and the benefits of effective data management for the voluminous information.
Establishing Peer Recovery Support Services to Address the Central Appalachian Opioid Epidemic: The West Virginia Peers Enhancing Education, Recovery, and Survival (WV PEERS) Pilot Program
Introduction: Central Appalachia has been disproportionately affected by the opioid epidemic and overdose fatalities. We developed West Virginia Peers Enhancing Education, Recovery, and Survival (WV PEERS), a program based on peer recovery support, to engage individuals using opioids and link them with a range of services.
Methods: Community partners providing services to individuals with opioid use disorder (OUD) were identified and collaborations were formalized using a standardized memorandum of understanding. The program was structured to offer ongoing peer recovery support specialist (PRSS) services, not just a one-time referral. A website and cards describing the WV PEERS program were developed and disseminated via community partners and community education sessions.
Results: Overall, 1456 encounters with individuals with OUD (mean = 2 encounters per individual) occurred in a variety of community settings over 8 months. The majority of referrals were from harm reduction programs. Overall, 63.9% (n = 931) of individuals served by WV PEERS accessed services for substance use disorders and/or mental health problems. Over half (52.3%; n = 487) of individuals entered substance use and/or mental health treatment, and nearly a third (30.4%; n = 283) remained in treatment for over six months.
Implications: Using the WV PEERS model, PRSSs effectively engaged individuals with OUD and linked them to mental health and substance use treatment in rural central Appalachia. Future research is needed to determine whether these services reduce the risk of overdose mortality.
Reconstruction of the joint state of a two-mode Bose-Einstein condensate
We propose a scheme to reconstruct the state of a two-mode Bose-Einstein
condensate, with a given total number of atoms, using an atom interferometer
that requires beam splitter, phase shift and non-ideal atom counting
operations. The density matrix in the number-state basis can be computed
directly from the probabilities of different counts for various phase shifts
between the original modes, unless the beamsplitter is exactly balanced.
Simulated noisy data from a two-mode coherent state is produced and the state
is reconstructed, for 49 atoms. The error can be estimated from the singular
values of the transformation matrix between state and probability data.
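The reconstruction idea in this abstract, inverting a linear map from state parameters to measured probabilities and reading the error off the map's singular values, can be sketched in a toy form. The matrix `M` below is a random stand-in, not the actual interferometer transformation from the paper, and the dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting: d real state parameters, m measured count probabilities.
# M plays the role of the map from the (vectorized) density matrix to the
# probabilities of different counts at various phase shifts; here it is a
# random stand-in for the real interferometer transformation.
d, m = 6, 20
M = rng.normal(size=(m, d))

rho_true = rng.normal(size=d)                  # stand-in state parameters
p = M @ rho_true                               # ideal probabilities
p_noisy = p + rng.normal(scale=1e-3, size=m)   # simulated counting noise

# Least-squares inversion of the noisy data.
rho_est, *_ = np.linalg.lstsq(M, p_noisy, rcond=None)

# The singular values of M bound the noise amplification: the error is at
# most the noise norm divided by the smallest singular value.
s = np.linalg.svd(M, compute_uv=False)
print("condition number:", s[0] / s[-1])
print("reconstruction error:", np.linalg.norm(rho_est - rho_true))
```

The "exactly balanced beamsplitter" caveat in the abstract corresponds, in this picture, to `M` becoming rank-deficient: the smallest singular value hits zero and the inversion fails.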
Detection of vorticity in Bose-Einstein condensed gases by matter-wave interference
A phase-slip in the fringes of an interference pattern is an unmistakable
characteristic of vorticity. We show dramatic two-dimensional simulations of
interference between expanding condensate clouds with and without vorticity. In
this way, vortices may be detected even when the core itself cannot be
resolved.
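The fringe phase-slip signature can be reproduced with a minimal stand-in: two Gaussian clouds with a relative momentum, one carrying a unit phase winding. All widths, positions, and the momentum below are illustrative choices, not the paper's simulation parameters, and no expansion dynamics are evolved:

```python
import numpy as np

# 2-D grid for two overlapping condensate clouds.
x = np.linspace(-10.0, 10.0, 256)
X, Y = np.meshgrid(x, x)

def cloud(x0, vortex=False):
    """Gaussian cloud centred at (x0, 0); optionally with a unit vortex
    (phase winding of 2*pi about its centre)."""
    psi = np.exp(-((X - x0) ** 2 + Y ** 2) / 8.0)
    if vortex:
        psi = psi * np.exp(1j * np.arctan2(Y, X - x0))
    return psi

# A relative momentum k produces straight interference fringes; the vortex
# in the second cloud inserts a fork (phase slip) into the pattern.
k = 2.0
density = np.abs(cloud(-3.0) * np.exp(1j * k * X) + cloud(3.0, vortex=True)) ** 2
print(density.shape)
```

Plotting `density` (e.g. with `matplotlib.pyplot.imshow`) shows the dislocated fringe even though the vortex core itself is a single grid point, which is the detection argument of the abstract.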
Macrostrain measurement using radial collimators at LANSCE
A series of "short" radial collimators have been implemented in the 90° scattering geometries on the neutron powder diffractometer at Los Alamos. The capability to perform macrostrain measurements has been improved by the commensurate ability to rapidly select a sampling volume appropriate to the specimen. The compact design of the collimators was dictated by the need to fit them in a cylindrical vacuum chamber as well as providing space in which to manipulate a specimen in three dimensions. Collimators of different vane lengths were fabricated to give 4 different resolutions, for which 2/3 of the diffracted intensity comes from distances of 0.75, 1.25, 2.5, and 4.0 mm along the incident beam. Qualifying scans and a demonstration of a cracked ring, containing a steep stress gradient, are included.
Pumping two dilute gas Bose-Einstein condensates with Raman light scattering
We propose an optical method for increasing the number of atoms in a pair of
dilute gas Bose-Einstein condensates. The method uses laser-driven Raman
transitions which scatter atoms between the condensate and non-condensate atom
fractions. For a range of condensate phase differences there is destructive
quantum interference of the amplitudes for scattering atoms out of the
condensates. Because the total atom scattering rate into the condensates is
unaffected the condensates grow. This mechanism is analogous to that
responsible for optical lasing without inversion. Growth using macroscopic
quantum interference may find application as a pump for an atom laser.